Resources for bosonic quantum computational advantage
Quantum computers promise to dramatically outperform their classical
counterparts. However, the non-classical resources enabling such computational
advantages are challenging to pinpoint, as it is not a single resource but the
subtle interplay of many that can be held responsible for these potential
advantages. In this work, we show that every bosonic quantum computation can be
recast into a continuous-variable sampling computation where all computational
resources are contained in the input state. Using this reduction, we derive a
general classical algorithm for the strong simulation of bosonic computations,
whose complexity scales with the non-Gaussian stellar rank of both the input
state and the measurement setup. We further study the conditions for an
efficient classical simulation of the associated continuous-variable sampling
computations and identify an operational notion of non-Gaussian entanglement
based on the lack of passive separability, thus clarifying the interplay of
bosonic quantum computational resources such as squeezing, non-Gaussianity and
entanglement.
Comment: 12 pages, 1 figure. Comments are welcome
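The stellar rank mentioned above counts, roughly, how far a state is from the Gaussian states: Gaussian states have rank 0, while the Fock state |n> has rank n. As a minimal numerical illustration (our own toy construction, not the paper's simulation algorithm; all function names are ours), the parity-operator identity W(0,0) = (1/pi) Tr[rho Pi] separates a Gaussian coherent state from the lowest-rank non-Gaussian Fock state by the sign of the Wigner function at the origin:

```python
import numpy as np
from math import factorial

def wigner_origin(rho):
    """Wigner value at the phase-space origin via the parity operator:
    W(0,0) = (1/pi) Tr[rho Pi], with Pi = sum_n (-1)^n |n><n|."""
    parity = np.diag((-1.0) ** np.arange(rho.shape[0]))
    return np.real(np.trace(rho @ parity)) / np.pi

def fock(n, dim):
    """Density matrix of the Fock state |n> in a truncated basis
    (stellar rank n)."""
    v = np.zeros(dim)
    v[n] = 1.0
    return np.outer(v, v)

def coherent(alpha, dim):
    """Density matrix of a coherent state with real amplitude alpha
    (a Gaussian state, stellar rank 0)."""
    ns = np.arange(dim)
    amps = np.exp(-alpha**2 / 2) * alpha**ns / np.sqrt(
        np.array([factorial(k) for k in ns], dtype=float))
    return np.outer(amps, amps)

# The rank-0 coherent state has a positive Wigner function everywhere,
# while the rank-1 Fock state reaches the negative value -1/pi at the origin.
w_gauss = wigner_origin(coherent(0.5, 30))
w_fock1 = wigner_origin(fock(1, 20))
```

Wigner negativity is only a witness here; the abstract's point is the finer one that the simulation cost tracks the stellar rank of both input and measurement.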
Mode-Dependent Loss Model for Multimode Photon-Subtracted States
Multimode photon-subtraction provides an experimentally feasible option to
construct large non-Gaussian quantum states in continuous-variable quantum
optics. The non-Gaussian features of the state can lead towards the more exotic
aspects of quantum theory, such as negativity of the Wigner function. However,
the pay-off for states with such delicate quantum properties is their
sensitivity to decoherence. In this paper, we present a general model that
treats the most important source of decoherence in a purely optical setting:
losses. We use the framework of open quantum systems and master equations to
describe losses in n-photon-subtracted multimode states, where each photon can
be subtracted in an arbitrary mode. As a main result, we find that
mode-dependent losses and photon-subtraction generally do not commute. In
particular, the losses not only reduce the purity of the state, they also
change the modal structure of its non-Gaussian features. We then conduct a
detailed study of single-photon subtraction from a multimode Gaussian state,
which is a setting that lies within the reach of present-day experiments.
Comment: 14 pages, 8 figures
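A minimal two-mode toy model (our own construction with assumed parameters, not the paper's master-equation treatment) already exhibits the non-commutation: subtract a photon from the balanced superposition mode (a1 + a2)/sqrt(2) of the state |1,1>, apply unequal per-mode losses eta1 != eta2, and compare the two orderings using the standard Kraus decomposition of the pure-loss channel:

```python
import numpy as np
from math import factorial, sqrt

DIM = 3  # per-mode Fock cutoff, exact here since each mode holds <= 1 photon

def annihilation(dim):
    return np.diag(np.sqrt(np.arange(1, dim)), 1)

a = annihilation(DIM)
I = np.eye(DIM)
a1, a2 = np.kron(a, I), np.kron(I, a)

def loss_kraus(eta, dim):
    """Kraus operators of the single-mode pure-loss channel:
    K_k = sqrt((1-eta)^k / k!) * eta^{n/2} * a^k."""
    n = np.arange(dim)
    return [np.sqrt((1 - eta)**k / factorial(k))
            * np.diag(eta**(n / 2)) @ np.linalg.matrix_power(annihilation(dim), k)
            for k in range(dim)]

def apply_loss(rho, eta1, eta2):
    """Mode-dependent loss channel: independent loss eta_i on mode i."""
    out = np.zeros_like(rho)
    for K1 in loss_kraus(eta1, DIM):
        for K2 in loss_kraus(eta2, DIM):
            K = np.kron(K1, K2)
            out = out + K @ rho @ K.conj().T
    return out

def subtract(rho):
    """Subtract one photon in the balanced mode (a1 + a2)/sqrt(2)."""
    S = (a1 + a2) / sqrt(2)
    sig = S @ rho @ S.conj().T
    return sig / np.trace(sig)

# input state |1,1>
psi = np.zeros(DIM * DIM)
psi[1 * DIM + 1] = 1.0
rho = np.outer(psi, psi)

eta1, eta2 = 0.9, 0.5
rho_sub_then_loss = apply_loss(subtract(rho), eta1, eta2)
rho_loss_then_sub = subtract(apply_loss(rho, eta1, eta2))

# trace distance between the two orderings: nonzero when eta1 != eta2
delta = rho_sub_then_loss - rho_loss_then_sub
tracedist = 0.5 * np.sum(np.abs(np.linalg.eigvalsh(delta)))
```

With eta1 = eta2 the two orderings agree up to normalization; it is the asymmetry of the losses that shifts the effective mode in which the photon was subtracted, in line with the abstract's claim.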
A Statistical Theory of Designed Quantum Transport Across Disordered Networks
We explain how centrosymmetry, together with a dominant doublet in the local
density of states, can guarantee interference-assisted, strongly enhanced,
strictly coherent quantum excitation transport between two predefined sites of
a random network of two-level systems. Starting from a generalisation of the
chaos assisted tunnelling mechanism, we formulate a random matrix theoretical
framework for the analytical prediction of the transfer time distribution, of
lower bounds of the transfer efficiency, and of the scaling behaviour of
characteristic statistical properties with the size of the network.
Comment: 23 pages, 8 figures
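To make the setting concrete, a short sketch (our own illustration with invented parameters, not the paper's analytical random-matrix prediction): draw a GOE disorder realisation, enforce centrosymmetry by averaging the Hamiltonian with its mirror conjugate, and evaluate the site-to-site transfer probability from the spectral decomposition:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 8  # network sites; input site 0 and output site N-1 form a mirror pair

# Exchange matrix J swaps site j with site N-1-j; centrosymmetry means JHJ = H.
J = np.fliplr(np.eye(N))

# GOE disorder realisation, projected onto the centrosymmetric matrices.
A = rng.normal(size=(N, N))
H = (A + A.T) / 2
H = (H + J @ H @ J) / 2

# Transfer probability |<out| exp(-i H t) |in>|^2 on a time grid,
# computed from the eigendecomposition of H (real eigenvectors).
evals, V = np.linalg.eigh(H)
overlaps = V[0, :] * V[N - 1, :]          # <in|k><k|out>
times = np.linspace(0.0, 30.0, 600)
amplitudes = np.exp(-1j * np.outer(times, evals)) @ overlaps
p_transfer = np.abs(amplitudes) ** 2
```

Repeating this over many disorder realisations yields the empirical transfer-time and efficiency statistics whose analytical form the abstract describes.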
Validating multi-photon quantum interference with finite data
Multi-particle interference is a key resource for quantum information
processing, as exemplified by Boson Sampling. Hence, given its fragile nature,
an essential desideratum is a solid and reliable framework for its validation.
However, while several protocols have been introduced to this end, the field
remains fragmented and lacks a unified picture to guide future developments.
In this work, we propose an operational approach to validation that encompasses
and strengthens the state of the art for these protocols. To this end, we
consider Bayesian hypothesis testing and the statistical benchmark as the most
favorable protocols for small- and large-scale applications, respectively. We
numerically investigate their operation with finite sample size, extending
previous tests to larger dimensions, and against two adversarial algorithms for
classical simulation: the Mean-Field sampler and the Metropolized Independent
Sampler. To evidence the actual need for refined validation techniques, we show
how the assessment of numerically simulated data depends on the available
sample size, as well as on the internal hyper-parameters and other practically
relevant constraints. Our analyses provide general insights into the challenge
of validation, and can inspire the design of algorithms with a measurable
quantum advantage.
Comment: 10 pages, 7 figures
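As a toy version of the Bayesian protocol (the distributions and sample sizes are invented for illustration; an actual Boson Sampling validation would use the physical output distributions of the device and the mock-up sampler), one accumulates log-likelihood ratios of the observed outcomes under the two hypotheses and converts the resulting odds into a posterior probability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypotheses over the same outcome space: a stand-in for the targeted
# "quantum" distribution (the ground truth here) and a flat classical mock-up.
p = np.array([0.4, 0.3, 0.2, 0.1])       # "quantum" hypothesis
q = np.array([0.25, 0.25, 0.25, 0.25])   # "classical mock" hypothesis

def posterior_quantum(samples, p, q, prior=0.5):
    """Bayesian hypothesis test: posterior probability that the samples were
    drawn from p rather than q. Likelihood ratios are accumulated in log
    space for numerical stability."""
    log_odds = np.log(prior / (1 - prior)) + np.sum(np.log(p[samples] / q[samples]))
    return 1 / (1 + np.exp(-log_odds))

samples = rng.choice(len(p), size=500, p=p)
post_small = posterior_quantum(samples[:10], p, q)   # finite-data regime
post_large = posterior_quantum(samples, p, q)        # asymptotic regime
```

With only a handful of samples the posterior can still favor either hypothesis, which is exactly the finite-data effect studied above; confidence in the true model builds as samples accumulate.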